Meta says advertisers must disclose when a social issue, election, or political ad on Facebook or Instagram has been “digitally created or altered,” including by AI.

Meta’s Move: Helping India Fight Fake News with Political Ad Disclosure Policy

Earlier this week, India’s Minister of State for Electronics and Information Technology Rajeev Chandrasekhar slammed social media platforms for their inability to deal with deepfake content and warned them of consequences if they do not remove reported fake information within 36 hours of notification.

This happened after actress Rashmika Mandanna’s recent deepfake controversy. A deepfake of her was spread on social media, and several big-name celebrities stood up to support the actress and spoke out against AI-driven deepfakes.

Now, while it may just be a coincidence, coming on the heels of the deepfake controversy that drew the Indian government’s attention, and with state elections in Rajasthan, Madhya Pradesh and other states approaching, Meta has swung into action. It announced that starting in the new year, it will require disclosure when a social issue, election or political ad on Meta platforms like Facebook or Instagram has been “digitally created or altered,” including with artificial intelligence. The policy applies worldwide.

“Advertisers must report any time a social issue, election or political ad contains a photo-realistic image or video or realistic-sounding audio that has been digitally created or altered,” Meta said. The policy covers anything that depicts a real person saying or doing something they didn’t say or do; a realistic-looking person who doesn’t exist; a realistic-looking event that didn’t happen, or material altered from what actually happened; or a realistic depiction of an event that allegedly occurred but is not a real photo, video or audio recording of it.

In addition, Meta said that advertisers need not disclose digital changes in an ad if the changes are minor and do not significantly alter the ad’s message. These small adjustments can include resizing the image, cropping it, or fine-tuning elements such as color or brightness. However, if the changes are major and affect the message, advertisers must disclose them.

If an advertiser fails to disclose the required information, Meta will reject the ad, and repeated failures to disclose may result in penalties.

A “legal” reminder by the Government of India

Whether these changes are the result of the continuing spread of AI-driven disinformation in India and globally, whether through recent deepfakes or the Israel-Hamas conflict, cannot be ascertained. However, it is clear that the Indian government is imposing stricter obligations on social media platforms under the amended IT rules that came into effect in India in April 2023.

On Monday, Rajeev Chandrasekhar reminded platforms that they must “ensure that no user publishes false information” and must also “ensure that false information is removed within 36 hours of being reported by the user or the government.” If any platform fails to comply with these mandates, “Rule 7 will apply and the aggrieved person may take the platforms to court under the provisions of the IPC.”
